Search: All records where Creators/Authors contains "Gray, Colin M"

Note: Clicking a Digital Object Identifier (DOI) number takes you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. As deceptive design practices proliferate on technology platforms and increasingly threaten user agency and well-being, concerned online communities are using social media platforms to discuss and challenge these unethical practices. In this paper, we conducted a case study analysis of two subreddits, r/privacy and r/assholedesign, to investigate the kinds of ethical concerns expressed within both subreddits, the strategies employed to express those concerns, the goals participants hoped to achieve through participation, and the community infrastructure these communities created to support their collective action against technology manipulation. Our findings show that posts on these subreddits employ different strategies to discuss ethical and value-related issues, revealing instances where community members engage in ethics “by other means”: raising attention to problematic practices, identifying workarounds, and encouraging activism. The findings also show that members of both communities transformed individual frustrations with technology manipulation into collective ethical action, involving value contestation and interaction criticism of problematic technology artifacts mediated by the socio-technical community infrastructure they designed to support these objectives. We conclude by highlighting opportunities for CSCW scholars to further encourage user engagement with technology ethics concepts by considering the role of community infrastructure and the different rhetorics of ethics that express different combinations of values and desired outcomes.
    Free, publicly-accessible full text available October 18, 2026
  2. The EU ePrivacy Directive requires consent before using cookies or other tracking technologies, while the EU General Data Protection Regulation (“GDPR”) sets high-level, principle-based requirements for such consent to be valid. However, translating these requirements into concrete interface designs for consent banners is far from straightforward. This situation has given rise to manipulative tactics in user experience (“UX”), commonly known as dark patterns, which influence users’ decision-making and may violate the GDPR requirements for valid consent. To address this problem, EU regulators aim to interpret GDPR requirements and to limit the design space of consent banners within their guidelines. Academic researchers from various disciplines address the same problem by performing user studies to evaluate the impact of design and dark patterns on users’ decision-making. Regrettably, the guidelines and user studies rarely inform each other. In this Essay, we collected and analyzed seventeen official guidelines issued by EU regulators and the European Data Protection Board (“EDPB”), as well as eleven consent-focused empirical user studies, which we examined from a User Interface (“UI”) design perspective. We identified numerous gaps between consent banner designs recommended by regulators and those evaluated in user studies. By doing so, we contribute to both the regulatory discourse and future user studies. We pinpoint EU regulatory inconsistencies and provide actionable recommendations for regulators. For academic scholars, we synthesize insights on design elements discussed by regulators that require further evaluation in user studies. Finally, we recommend that the EDPB and EU regulators, alongside usability, Human-Computer Interaction (“HCI”), and design researchers, engage in transdisciplinary dialogue to close the gap between EU guidelines and user studies.
  3. Deceptive and coercive design practices are increasingly used by companies to extract profit, harvest data, and limit consumer choice. Dark patterns represent the most common contemporary amalgamation of these problematic practices, connecting designers, technologists, scholars, regulators, and legal professionals in transdisciplinary dialogue. However, a lack of universally accepted definitions across the academic, legislative, practitioner, and regulatory space has likely limited the impact that scholarship on dark patterns might have in supporting sanctions and evolved design practices. In this paper, we seek to support the development of a shared language of dark patterns, harmonizing ten existing regulatory and academic taxonomies of dark patterns and proposing a three-level ontology with standardized definitions for 64 synthesized dark pattern types across low-, meso-, and high-level patterns. We illustrate how this ontology can support translational research and regulatory action, including transdisciplinary pathways to extend our initial types through new empirical work across application and technology domains. 
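     The three-level structure described above (low-, meso-, and high-level patterns) can be read as a simple tree-shaped data model. The Python sketch below is only an illustration under assumed names: the entries and definitions are placeholders, not the ontology's actual wording, and the class and function names are invented for this example.

         from dataclasses import dataclass, field

         @dataclass
         class PatternType:
             """One dark pattern type with a standardized definition."""
             name: str
             definition: str
             children: list["PatternType"] = field(default_factory=list)

         # Placeholder entries for illustration only: one branch spanning the
         # high-, meso-, and low-level tiers of a three-level ontology.
         low = PatternType(
             name="Hard-to-cancel subscription",   # hypothetical low-level pattern
             definition="Cancelling requires steps that signing up did not.",
         )
         meso = PatternType(
             name="Roach Motel",                   # hypothetical meso-level pattern
             definition="Easy to enter a situation, hard to leave it.",
             children=[low],
         )
         high = PatternType(
             name="Obstruction",                   # hypothetical high-level pattern
             definition="Impeding a task flow to discourage an action.",
             children=[meso],
         )
         ontology = [high]

         def count_types(nodes: list[PatternType]) -> int:
             """Count every pattern type across all three levels."""
             return sum(1 + count_types(node.children) for node in nodes)

     A representation like this is one way translational tools could traverse from a concrete low-level observation up to the high-level pattern it instantiates; the paper itself does not prescribe any particular encoding.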
  4. Ethics as embodied by technology practitioners resists simple definition—particularly as it relates to the interplay of identity, organizational, and professional complexity. In this paper, we use the linguistic notion of languaging as an analytic lens to describe how technology and design practitioners negotiate their conception of ethics as they reflect upon their everyday work. We engaged twelve practitioners in individual co-creation workshops, encouraging them to reflect on their ethical role in their everyday work through a series of generative and evaluative activities. We analyzed these data to identify how each practitioner reasoned about ethics through language and artifacts, finding that practitioners used a range of rhetorical tropes to describe their ethical commitments and beliefs in ways that were complex and sometimes contradictory. Across three cases, we describe how ethics was negotiated through language within three key zones of ecological emergence: the practitioner’s “core” beliefs about ethics, the internal and external ecological elements that shaped or mediated these core beliefs, and the ultimate boundaries they reported refusing to cross. Building on these findings, we describe how the languaging of ethics reveals opportunities to engage with ethics, both definitionally and practically, in technology ethics research, practice, and education.
  5. Deceptive, manipulative, and coercive practices are deeply embedded in our digital experiences, impacting our ability to make informed choices and undermining our agency and autonomy. These design practices—collectively known as “dark patterns” or “deceptive patterns”—are increasingly under legal scrutiny and sanctions, largely due to the efforts of human-computer interaction scholars who have conducted pioneering research on dark pattern types, definitions, and harms. In this workshop, we continue building this scholarly community with a focus on organizing for action. Our aims include building capacity around specific research questions relating to (i) methodologies for detection; (ii) characterization of harms; and (iii) effective countermeasures. Through the outcomes of the workshop, we will connect our scholarship to the legal, design, and regulatory communities to inform further legislative and legal action.
  6. Design and technology practitioners are increasingly aware of the ethical impact of their work practices, and they desire tools to support their ethical awareness across a range of contexts. In this paper, we report findings from a series of six co-creation workshops with 26 technology and design practitioners that supported their creation of a bespoke ethics-focused action plan. Using a qualitative content analysis and thematic analysis approach, we identified a range of roles and process moves employed by these practitioners and design students with professional experience, and we illustrate how the interplay of these elements shaped the creation of their action plans and revealed aspects of their ethical design complexity. We conclude with implications for supporting ethics in socio-technical practice and opportunities for the further development of methods that support ethical engagement and are resonant with the realities of practice.
  7. The development of Artificial Intelligence (AI) systems involves a significant level of judgment and decision making on the part of engineers and designers to ensure the safety, robustness, and ethical design of such systems. However, the kinds of judgments that practitioners employ while developing AI platforms are rarely foregrounded or examined, leaving unexplored the areas where practitioners might need ethical support. In this short paper, we employ the concept of design judgment to foreground and examine the kinds of sensemaking software engineers use to inform their decision making while developing AI systems. Relying on data generated from two exploratory observation studies of student software engineers, we connect the foregrounded judgments to the concept of fairness to implicate their potential algorithmic fairness impacts. Our findings surface some ways in which the design judgment of software engineers could adversely impact the downstream goal of ensuring fairness in AI systems. We discuss the implications of these findings for fostering positive innovation and enhancing fairness in AI systems, drawing attention to the need to provide ethical guidance, support, or intervention to practitioners as they engage in situated and contextual judgments while developing AI systems.
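     The last abstract connects engineers' upstream design judgments to downstream algorithmic fairness. As a rough illustration of what one common downstream check looks like, the Python sketch below computes the demographic parity difference, a standard group fairness metric; the metric choice, function name, and toy data are assumptions for this example and are not taken from the studies.

         from collections import defaultdict

         def demographic_parity_difference(predictions, groups):
             """Gap between the highest and lowest positive-prediction rates across groups.

             predictions: iterable of 0/1 model outputs
             groups: iterable of group labels aligned with predictions
             """
             totals = defaultdict(int)
             positives = defaultdict(int)
             for pred, grp in zip(predictions, groups):
                 totals[grp] += 1
                 positives[grp] += pred
             rates = {g: positives[g] / totals[g] for g in totals}
             return max(rates.values()) - min(rates.values())

         # Toy numbers: group "A" receives positive predictions 2/3 of the time,
         # group "B" only 1/3 of the time, so the gap is about 0.33.
         gap = demographic_parity_difference([1, 0, 1, 1, 0, 0], ["A", "A", "A", "B", "B", "B"])
         print(round(gap, 2))  # 0.33

     A large gap of this kind is one concrete signal that earlier development choices, of the sort shaped by the design judgments the studies foreground, have propagated into unequal treatment of groups.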